Jan 20, 2010
“Glaciergate” - Opinion: Sorry But This Stinks
By Roger Pielke Jr.
The IPCC treatment of Himalayan glaciers and its chairman’s conflicts of interest are related. The points and timeline below are as I understand them and are informed by reporting by Richard North.
1. In 2007 the IPCC issues its Fourth Assessment Report which contains the false claim that the Himalayan glaciers are expected to disappear by 2035.
2. The basis for that statement was a speculative comment that Syed Hasnain, then (and afterward) a professor at Jawaharlal Nehru University in Delhi, made to a reporter in 1999.
3. Following the publication of the IPCC report, and the widespread media coverage of the false claim about Himalayan glaciers, Dr. Hasnain joins TERI as a Senior Fellow, where Dr. Pachauri is the director.
4. Drs. Pachauri and Hasnain together seek to raise funds for TERI for work on Himalayan glaciers, justified by the work of the IPCC, according to Dr. Pachauri just last week:
Scientific data assimilated by IPCC is very robust and it is universally acknowledged that glaciers are melting because of climate change. The Energy & Resources Institute (TERI) in its endeavor to facilitate the development of an effective policy framework and their strategic implementation for the adaptation and mitigation of climate change impacts on the local population is happy to collaborate with the University of Iceland, Ohio State University and the Carnegie Corporation of New York.
5. When initially questioned about the scientific errors, Dr. Pachauri calls such questions “voodoo science” in the days leading up to the announcement of TERI receiving funding on this subject. Earlier, Dr. Pachauri had criticized in the harshest terms the claims made by the Indian government that were contrary to those in the IPCC.
Pachauri said that such statements were reminiscent of “climate change deniers and school boy science”.
6. Subsequent to the error being more fully and publicly recognized, when asked by a reporter about the IPCC’s false claims, Dr. Pachauri says that he has no responsibility for what Dr. Hasnain may have said, and Dr. Hasnain says, rather cheekily, that the IPCC had no business citing his comments:
“It is not proper for IPCC to include references from popular magazines or newspapers.”
Of course, neither Dr. Pachauri nor Dr. Hasnain ever said anything about the error when it was receiving worldwide attention (as being true) in 2007 and 2008, nor did they raise any issues with the IPCC citing non-peer-reviewed work (which is a systemic problem). They did, however, use the IPCC and its false claims as justification in support of fundraising for their own home institution. At no point was any of this disclosed.
If the above facts and timeline are correct (and I welcome any corrections to details that I may have in error), then what we have here is a classic and unambiguous case of financial conflict of interest. IPCC Chairman Pachauri was making public comments on a dispute involving factual claims by the IPCC at the same time that he was negotiating for funding to his home institution, justified by those very same claims. If we were instead discussing scientific advisors on drug safety, and funding from a pharmaceutical company to the advisory committee chair, the conflict would be obvious.
Climate science desperately needs to clean up its act. See Roger’s post here. See more in Richard North’s story “Pachauri: There’s Money in Them Glaciers” here.
------------------------
UN climate report riddled with errors on glaciers
By Seth Borenstein
Five glaring errors were discovered in one paragraph of the world’s most authoritative report on global warming, forcing the Nobel Prize-winning panel of climate scientists who wrote it to apologize and promise to be more careful. The errors are in a 2007 report by the Intergovernmental Panel on Climate Change, a U.N.-affiliated body. All the mistakes appear in a subsection that suggests glaciers in the Himalayas could melt away by the year 2035 - hundreds of years earlier than the data actually indicates. The year 2350 apparently was transposed as 2035.
The climate panel and even the scientist who publicized the errors said they are not significant in comparison to the entire report, nor were they intentional. And they do not negate the fact that worldwide, glaciers are melting faster than ever. But the mistakes open the door for more attacks from climate change skeptics.
“The credibility of the IPCC depends on the thoroughness with which its procedures are adhered to,” Yvo de Boer, head of the U.N. Framework Convention on Climate Change, told The Associated Press in an e-mail. “The procedures have been violated in this case. That must not be allowed to happen again because the credibility of climate change policy can only be based on credible science.” The incident follows a furor late last year over the release of stolen e-mails in which climate scientists talked about suppressing data and freezing out skeptics of global warming. And on top of that, an intense cold spell has some people questioning whether global warming exists.
In a statement, the climate change panel expressed regret over what it called “poorly substantiated estimates” about the Himalayan glaciers. “The IPCC has established a reputation as a real gold standard in assessment; this is an unfortunate black mark,” said Chris Field, a Stanford University professor who in 2008 took over as head of this part of the IPCC research. “None of the experts picked up on the fact that these were poorly substantiated numbers. From my perspective, that’s an area where we have an opportunity to do much better.”
Patrick Michaels, a global warming skeptic and scholar at the Cato Institute, a libertarian think tank, called on the head of the IPCC, Rajendra Pachauri, to resign, adding: “I’d like to know how such an absurd statement made it through the review process. It is obviously wrong.” However, a number of scientists, including some critics of the IPCC, said the mistakes do not invalidate the main conclusion that global warming is without a doubt man-made and a threat. The mistakes were found not by skeptics like Michaels, but by a few of the scientists themselves, including one who is an IPCC co-author.
The report in question is the second of four issued by the IPCC in 2007 on global warming. This 838-page document had chapters on each continent. The errors were in a half-page section of the Asia chapter. The section got it wrong as to how fast the thousands of glaciers in the Himalayas are melting, scientists said. “It is a very shoddily written section,” said Graham Cogley, a professor of geography and glaciers at Trent University in Peterborough, Canada, who brought the error to everyone’s attention. “It wasn’t copy-edited properly.”
Cogley, who wrote a letter about the problems to Science magazine that was published online Wednesday, cited these mistakes:
- The paragraph starts, “Glaciers in the Himalayas are receding faster than in any other part of the world.” Cogley and Michael Zemp of the World Glacier Monitoring Service said Himalayan glaciers are melting at about the same rate as other glaciers.
- It says that if the Earth continues to warm, the “likelihood of them disappearing by the year 2035 and perhaps sooner is very high.” Nowhere in the peer-reviewed scientific literature is 2035 mentioned. However, there is a study from Russia that says glaciers could come close to disappearing by 2350. Probably the numbers in the date were transposed, Cogley said.
- The paragraph says: “Its total area will likely shrink from the present 500,000 to 100,000 square kilometers by the year 2035.” Cogley said there are only 33,000 square kilometers of glaciers in the Himalayas.
- The entire paragraph is attributed to the World Wildlife Fund, when only one sentence came from the WWF, Cogley said. And further, the IPCC likes to brag that it is based on peer-reviewed science, not advocacy group reports. Cogley said the WWF cited the popular science press as its source.
- A table says that between 1845 and 1965, the Pindari Glacier shrank by 2,840 meters. Then comes a math mistake: It says that’s a rate of 135.2 meters a year, when it really is only 23.5 meters a year.
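As a sanity check on that arithmetic, here is a minimal Python sketch. The 121-year inclusive span, and the guess that the published 135.2 figure came from dividing by 21 years, are inferences for illustration, not anything stated in the AP story:

```python
# Quick check of the Pindari Glacier numbers from the AP story.
retreat_m = 2840                 # total shrinkage reported for 1845-1965
years = 1965 - 1845              # 120 elapsed years (121 counted inclusively)

print(round(retreat_m / (years + 1), 1))  # 23.5 m/yr: Cogley's corrected rate
print(round(retreat_m / 21, 1))           # 135.2 m/yr: dividing by 21 years
                                          # reproduces the erroneous table value
```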
Still, Cogley said: “I’m convinced that the great bulk of the work reported in the IPCC volumes was trustworthy and is trustworthy now as it was before the detection of this mistake.” He credited Texas state climatologist John Nielsen-Gammon with telling him about the errors.
However, University of Colorado environmental science and policy professor Roger Pielke Jr. said the errors point to a “systematic breakdown in IPCC procedures,” and that means there could be more mistakes.
A number of scientists pointed out that at the end of the day, no one is disputing the Himalayan glaciers are shrinking.
“What is happening now is comparable with the Titanic sinking more slowly than expected,” de Boer said in his e-mail. “But that does not alter the inevitable consequences, unless rigorous action to reduce greenhouse gas emissions is taken.” Read more here.
More on “Glaciergate” here. See this story in The Australian, “Heeding the political lessons of Glaciergate,” here.
----------------------
Comments from IPCC Reviewer on Glaciergate
By Dr. Madhav Khandekar
Allow me some more comments on “Glaciergate.”
1. Dr R Pachauri, IPCC Chair, in early December 2009 blasted India’s Environment Minister Mr Jairam Ramesh in a public commentary reported in the Times of India: “Mr Ramesh is being irresponsible for not heeding the IPCC ‘warning’ about Himalayan glaciers melting away by 2035.” Mr Ramesh retorted that “this IPCC conclusion is NOT based on any iota of scientific evidence.” It now turns out that Ramesh was right and Pachauri was wrong! It is quite possible that Pachauri, in early December 2009, knew the IPCC conclusion on Himalayan glaciers was wrong but still chose to put heat on India for purely political reasons.
2. I reviewed IPCC WGII Chapter 1, the largest and most important chapter of WGII, in two stages: the First Order Draft (FOD) in November 2005 and the Second Order Draft (SOD) in July 2006. Chapter 1 makes a reference to glacier shrinkage in general and refers to Mt. Kilimanjaro as “depleting rapidly.” That chapter completely ignores Georg Kaser’s excellent 2004 paper on Mt Kilimanjaro in the International Journal of Climatology, which was readily available to IPCC authors. I pointed this out in my reviews, but the IPCC authors made NO attempt to include Kaser’s reference, nor to address the many other omissions I pointed out.
3. The IPCC review process is flawed, and it is time to emphasize that. The IPCC 2007 documents CANNOT be considered peer-reviewed documents. They should be treated like the many other reports coming out of universities and research organizations, and NOTHING more!
4. The scientific community at large has given undue importance to IPCC Documents.
Madhav Khandekar, IPCC Reviewer, 2007 Climate Change. See Madhav’s November story, which highlighted the exact problem now putting Pachauri on the hot seat.
Jan 20, 2010
Michael Mann’s Climate Stimulus
Wall Street Journal
As for stimulus jobs - whether “saved” or “created” - we thought readers might be interested to know whose employment they are sustaining. More than $2.4 million is stimulating the career of none other than Penn State climate scientist Michael Mann.
Mr. Mann is the creator of the famous hockey stick graph, which purported to show some 900 years of minor temperature fluctuations, followed by a spike in temperatures over the past century. His work, which became a short-term sensation when seized upon by Al Gore, was later discredited. Mr. Mann made the climate spotlight again last year as a central player in the emails from the University of East Anglia’s Climatic Research Unit, which showed climatologists massaging data, squelching opposing views, and hiding their work from the public.
Mr. Mann came by his grants via the National Science Foundation, which received $3 billion in stimulus money. Last June, the foundation approved a $541,184 grant to fund work “Toward Improved Projections of the Climate Response to Anthropogenic Forcing,” which will contribute “to the understanding of abrupt climate change.” Principal investigator? Michael Mann.
He received another grant worth nearly $1.9 million to investigate the role of “environmental temperature on the transmission of vector-borne diseases.” Mr. Mann is listed as a “co-principal investigator” on that project. Both grants say they were “funded under the American Recovery and Reinvestment Act of 2009.”
The NSF made these awards prior to last year’s climate email scandal, but a member of its Office of Legislative and Public Affairs told us she was “unaware of any discussion regarding suspending or changing the awards made to Michael Mann.” So your tax dollars will continue to fund a climate scientist whose main contribution to the field has been to discredit climate science. Read more here.
Read more about Michael Mann in “Stimulating Fraud” in the IBD here.
Jan 18, 2010
Get ready for seven-foot sea level rise as climate change melts ice sheets
Rob Young and Orrin Pilkey for Yale Environment 360, part of the Guardian Environment Network
The reports from the Intergovernmental Panel on Climate Change (IPCC) are balanced and comprehensive documents summarizing the impact of global warming on the planet. But they are not without imperfections, and one of the most notable was the analysis of future sea level rise contained in the latest report, issued in 2007.
Given the complexities of forecasting how much the melting of the Greenland and West Antarctic ice sheets will contribute to increases in global sea level, the IPCC chose not to include these giant ice masses in their calculations, thus ignoring what is likely to be the most important source of sea level rise in the 21st century. Arguing that too little was understood about ice sheet collapse to construct a mathematical model upon which even a rough estimate could be based, the IPCC came up with sea level predictions using thermal expansion of the oceans and melting of mountain glaciers outside the poles. Its results were predictably conservative - a maximum of a two-foot rise this century - and were even a foot lower than an earlier IPCC report that factored in some melting of Greenland’s ice sheet.
The IPCC’s 2007 sea level calculations - widely recognized by the academic community (Yeah, WHERE???) as a critical flaw in the report - have caused confusion among many in the general public and the media and have created fodder for global warming skeptics. But there should be no confusion about the serious threat posed by rising sea levels, especially as evidence has mounted in the past two years of the accelerated pace of melting of the Greenland and West Antarctic ice sheets.
Most climate scientists believe melting of the Greenland Ice Sheet will be one of the main drivers of sea level rise during this century.
The message for the world’s leaders and decision makers is that sea level rise is real and is only going to get worse. Indeed, we make the case in our recent book, The Rising Sea, that governments and coastal managers should assume the inevitability of a seven-foot rise in sea level. This number is not a prediction. But we believe that seven feet is the most prudent, conservative long-term planning guideline for coastal cities and communities, especially for the siting of major infrastructure; a number of academic studies examining recent ice sheet dynamics have suggested that an increase of seven feet or more is not only possible, but likely. Certainly, no one should be expecting less than a three-foot rise in sea level this century.
In the 20th century, sea level rise was primarily due to thermal expansion of ocean water. Contributions of melting mountain glaciers and the large ice sheets were minor components. But most climate scientists now believe that the main drivers of sea level rise in the 21st century will be the melting of the West Antarctic Ice Sheet (a potential of a 16-foot rise if the entire sheet melts) and the Greenland Ice Sheet (a potential rise of 20 feet if the entire ice cap melts). The nature of the melting is non-linear and is difficult to predict.
Seeking to correct the IPCC’s failure to come up with a comprehensive forecast for sea level increase, a number of state panels and government committees have produced sea level rise predictions that include an examination of melting ice sheets. For example, sea level rise panels in Rhode Island and Miami-Dade County have concluded that a minimum of a three- to five-foot sea level rise should be anticipated by 2100. A California report assumes a possible 4.6-foot rise by 2100, while the Dutch assume a 2.5-foot rise by 2050 in the design of their tidal gates.
Given the growing consensus about the major sea level rise on the way in the coming century or two, the continued development of many low-lying coastal areas - including much of the U.S. east coast - is foolhardy and irresponsible.
Rising seas will be on the front lines of the battle against changing climate during the next century. Our great concern is that as the infrastructure of major cities in the industrialized world becomes threatened, there will be few resources left to address the dramatic impacts that will be facing the citizens of the developing world.
Pacific and Indian Ocean atoll nations are already being abandoned because of the direct and indirect effects of sea level rise, such as saltwater intrusion into groundwater. In the Marshall Islands, some crops are being grown in abandoned 55-gallon oil drums because the ground is now too salty for planting. New Zealand is accepting, on a gradual basis, all of the inhabitants of the Tuvalu atolls. Inhabitants of Carteret Atoll have all moved to Papua New Guinea. The forward-looking government of the Maldives recently held a cabinet meeting underwater to highlight the ultimate fate of their small island nation.
The world’s major coastal cities will undoubtedly receive most of the attention as sea level rise threatens infrastructure. Miami tops the list of most endangered cities in the world, as measured by the value of property that would be threatened by a three-foot rise. This would flood all of Miami Beach and leave downtown Miami sitting as an island, disconnected from the rest of Florida. Other threatened U.S. cities include New York/Newark, New Orleans, Boston, Washington, Philadelphia, Tampa-St Petersburg, and San Francisco. Osaka/Kobe, Tokyo, Rotterdam, Amsterdam, and Nagoya are among the most threatened major cities outside of North America.
Preserving coastal cities will require huge public expenditures, leaving smaller coastal resort communities to fend for themselves. Manhattan, for example, is likely to beat out Nags Head, North Carolina for federal funds, a fact that recreational beach communities must recognize when planning a response to sea level rise.
Twelve percent of the world’s open ocean shorelines are fronted by barrier islands, and a three-foot sea level rise will spell doom for development on most of them - save for those completely surrounded by massive seawalls.
Impacts in the United States, with a 3,500-mile long barrier island shoreline extending from Montauk Point on Long Island to the Mexican border, will be huge. The only way to preserve the barrier islands themselves will be to abandon them so that they may respond naturally to rising sea level. Yet, most coastal states continue to allow massive, irresponsible development of the low-lying coast.
Ironically, low-elevation Florida is probably the least prepared of all coastal states. Hundreds of miles of high rises line the state’s shoreline, and more are built every year. The state pours subsidies into coastal development through state-run insurance and funding for coastal protection. If a portion of those funds were spent adapting to sea level rise rather than ignoring it, Florida might be ready to meet the challenge of the next century. Let’s hope the state rises to the challenge.
Despite the dire facts, the next century of rising sea level need not be an economic disaster. Thoughtful planning can lead to a measured retreat from vulnerable coastal lowlands. We recommend the following:
Immediately prohibit the construction of high-rise buildings and major infrastructure in areas vulnerable to future sea level rise. Buildings placed in future hazardous zones should be small and movable - or disposable.
Relocation of buildings and infrastructure should be a guiding philosophy. Instead of making major repairs on infrastructure such as bridges, water supply, and sewer and drainage systems, when major maintenance is needed, go the extra mile and place them out of reach of the sea. In our view, no new sewer and water lines should be introduced to zones that will be adversely affected by sea level rise in the next 50 years. Relocation of some beach buildings could be implemented after severe storms or with financial incentives.
Stop government assistance for oceanfront rebuilding. The guarantee of recovery is perhaps the biggest obstacle to a sensible response to sea level rise. The goal in the past has always been to restore conditions to what they were before a storm or flood. In the United States, hurricanes have become urban renewal programs. The replacement houses become larger and larger and even more costly to replace again in the future. Those who invest in vulnerable coastal areas need to assume responsibility for that decision. If you stay, you pay.
After years of reluctance, scientists and governments are now looking to adaptation measures as critical for confronting the consequences of climate change. And increasingly, plans are being developed to deal with rising seas, water shortages, spreading diseases, and other realities of a warming world.
Local governments cannot be expected to take the lead. The problems created by sea level rise are international and national, not local, in scope. Local governments of coastal towns (understandably) follow the self-interests of coastal property owners and developers, so preservation of buildings and maintaining tax base is inevitably a very high priority. In addition, the resources needed to respond to sea level rise will be far beyond those available to local communities.
Responding to long-term sea level rise will pose unprecedented challenges to the international community. Economic and humanitarian disasters can be avoided, but only through wise, forward-looking planning. Tough decisions will need to be made regarding the allocation of resources and response to natural disasters. Let us hope that our political leadership can provide the bold vision and strong leadership that will be required to implement a reasoned response. Read more of this ridiculous story here.
Icecap Note: Another “it’s far worse than even the IPCC thought” story. They did not mention James Hansen, who a year ago claimed sea levels would rise 246 feet (important to be precise). Not his first wild forecast. In 1988 Hansen was discussing with an underling how the area around Columbia University would change in 20 years. He expected a lot more traffic on local streets, because traffic would be diverted once “the major highway near the river would have flooded.” The story has it he claimed sea level would rise 3 feet by 2008. Sea level has changed less than 1 inch.
Jan 16, 2010
Professional Discourtesy By The National Climate Data Center On The Menne Et Al 2010 paper
By Roger Pielke Sr., Climate Science
The professional courtesy when researchers collect data is to permit them the first opportunity to publish. The National Institutes of Health (NIH) has written up this policy. The NIH writes in its section on the timeliness of data sharing:
“Recognizing that the value of data often depends on their timeliness, data sharing should occur in a timely fashion. NIH expects the timely release and sharing of data to be no later than the acceptance for publication of the main findings from the final dataset.”
The NIH writes with respect to its grantees: “In general, grantees own the rights in data resulting from a grant-supported project.” The NIH has simply written down what is standard professional courtesy with respect to data.
In the case of the site data that Anthony Watts has collected at considerable effort on his part and that of his outstanding volunteers (see), the National Climate Data Center (NCDC) is not recognizing this professional courtesy. They had already posted a (flawed) analysis of a subset of Anthony’s data (see). Simply recognizing Anthony’s pivotal role in identifying the current site characteristics of the USHCN sites, as listed in the Acknowledgements of the Menne et al (2009) paper (and the new JGR paper), is hardly adequate.
In disregard of the proper collegial approach to scientific interaction, and in contrast to the NIH policy, they have prematurely published a paper using a subset of the site classifications that Anthony has completed (and, moreover, the site classification data they used has not even gone through final quality assurance checks!). They used only ~40% of the USHCN sites, yet over 87% have actually been surveyed by Anthony’s volunteers.
The Editor who oversaw this paper is also to blame for the early appearance of this article. I was quite surprised to learn that despite the central role of Anthony Watts’s analysis in the paper, he was not asked to be a referee of the paper. This is inappropriate and suggests the Editor did not provide a balanced review process.
The new paper, which analyzes a subset of the available site data, is
Menne, M. J., C. N. Williams, and M. A. Palecki (2010): On the reliability of the U.S. Surface Temperature Record, J. Geophys. Res., doi:10.1029/2009JD013094, in press. (accepted 7 January 2010)
with the abstract:
“Recent photographic documentation of poor siting conditions at stations in the U.S. Historical Climatology Network (USHCN) has led to questions regarding the reliability of surface temperature trends over the conterminous U.S. (CONUS). To evaluate the potential impact of poor siting/instrument exposure on CONUS temperatures, trends derived from poor and well-sited USHCN stations were compared. Results indicate that there is a mean bias associated with poor exposure sites relative to good exposure sites; however, this bias is consistent with previously documented changes associated with the widespread conversion to electronic sensors in the USHCN during the last 25 years. Moreover, the sign of the bias is counterintuitive to photographic documentation of poor exposure because associated instrument changes have led to an artificial negative ("cool") bias in maximum temperatures and only a slight positive ("warm") bias in minimum temperatures. These results underscore the need to consider all changes in observation practice when determining the impacts of siting irregularities.
Further, the influence of non-standard siting on temperature trends can only be quantified through an analysis of the data. Adjustments applied to USHCN Version 2 data largely account for the impact of instrument and siting changes, although a small overall residual negative ("cool") bias appears to remain in the adjusted maximum temperature series. Nevertheless, the adjusted USHCN temperatures are extremely well aligned with recent measurements from instruments whose exposure characteristics meet the highest standards for climate monitoring. In summary, we find no evidence that the CONUS temperature trends are inflated due to poor station siting.”
We will discuss the science of the analysis in a subsequent post and a paper which is being prepared for submission. However, this post is about the process of compromising the standard scientific method, similar to what was revealed in several of the CRU e-mails. This same culture exists at NCDC under the direction of Tom Karl.
The publication of the Menne et al 2010 paper violates the professional courtesy that is standard practice among other scientific groups. We had even offered them co-authorship on our papers, so that we could benefit from their scientific expertise and they could benefit from ours. They refused.
This failure by NCDC to honor professional standards is just another example of the lack of accepted professional standards at this federal climate laboratory. They should have joined us in a paper or, as an appropriate alternative, waited until we published and then completed their analysis.
See Roger’s post here.
Jan 16, 2010
Climategate: The Perils of Global Warming Models
By John Droz, Jr., Pajamas Media
Everyone readily admits that things aren’t always what they seem. But are we really applying this knowledge in our daily dealings? Are we consciously ferreting out the illusory from the reality? I think not.
For instance, despite overwhelming evidence to the contrary, we aren’t really being run by pandering politicians, self-serving lobbyists, fanatical environmentalists, and greedy Wall Street manipulators. They are the illusion.
There is another, even more powerful (but much less visible) agent behind all of these puppets. The person behind the screen is the computer programmer. And, just like in The Wizard of Oz, they do not want you to look at this real controller.
I’ll probably have to turn in my membership card, but as a computer programmer (and physicist and environmental activist) I’m here to spill the beans about the Wiz. The first hint of trouble is spelled out in Wikipedia’s explanation about computer programmers:
The discipline differs from many other technical professions in that programmers generally do not need to be licensed or pass any standardized (or governmentally regulated) certification tests in order to call themselves “programmers” or even “software engineers.”
Hmmm.
My layperson explanation is that computer programming is all about making assumptions, and then converting these into mathematical equations. The big-picture question is this: Is it really possible to accurately convert complex real-world situations into ones and zeros? HAL may think so, but higher processing brains say no. Yet this is continuously attempted, with very limited success. Let’s pull the screen back a bit more. We’ll start with an example of how such a model makes assumptions.
One of the computer programs I wrote was for debt collectors. A typical scenario was that a debtor was given a date to make a payment and the collection company didn’t receive it on time. What response is then appropriate? In such a circumstance the computer program typically makes an automatic contact with the debtor. (Remember there are thousands of these debtors, and it would be prohibitively time consuming for an agency person to manually check into and follow up each case.) So what to say in this correspondence to the debtor? Well, it comes down to the assumptions made by the computer programmer.
The programmer tries to simplify such situations into mathematical options. In this case they may decide that the question is: “Does the debtor have the money to make this payment: yes or no?” This relatively basic choice then leads to a Boolean progression within the program. How does the programmer (model) decide on yes or no? Well, other indicators would be used (e.g., were prior payments made on time) to come up with a statistical probability.
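To make that concrete, here is a minimal sketch in Python of the kind of yes/no branch being described. Everything in it is invented for illustration - the function names, the payment-history scoring, and the 0.5 cutoff are assumptions, not taken from any real collections system:

```python
# Illustrative sketch: reduce "does the debtor have the money?" to a
# yes/no answer using prior payment history, then branch on it.

def likely_has_funds(prior_payments):
    """Guess yes/no from a list of booleans (True = paid on time)."""
    if not prior_payments:          # no history: the model must still pick a branch
        return False
    on_time_rate = sum(prior_payments) / len(prior_payments)
    return on_time_rate >= 0.5      # arbitrary cutoff chosen by the programmer

def next_action(prior_payments):
    # The Boolean progression: one yes/no answer selects the follow-up letter.
    if likely_has_funds(prior_payments):
        return "send firm payment-demand letter"
    return "send hardship/payment-plan letter"

print(next_action([True, True, False]))   # firm letter
print(next_action([False, False, True]))  # payment-plan letter
```

Note how much judgment is already baked in: why a 0.5 cutoff and not 0.7, and why should an empty history count as “no”? Those choices belong to the programmer, and the user never sees them.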
Of course, any computer model is not one set of choices, but rather a whole series of yes/no (if/or) calculations that lead to a conclusion. In a complex situation (e.g., debt collection, climate change, or financial derivatives) there could easily be a hundred such choices to deal with.
To understand the implications of that, let’s just consider the case where there are ten such decision points - each with a “yes” or “no” answer. At the end of such a pipeline, that means that there are 2^10 (i.e., 1024) possible results. That’s a lot of different potential conclusions.
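The count is easy to verify. Here is a two-line check of the combinatorics, assuming ten independent binary decision points as in the text:

```python
# Ten independent yes/no decision points give 2**10 distinct paths.
from itertools import product

paths = list(product((True, False), repeat=10))
print(len(paths))  # 1024
print(2 ** 10)     # 1024, the figure in the text
```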
Unfortunately, there are actually many more possibilities! The assumption that this debtor situation could be condensed down to a “yes” or “no” answer is not accurate. There are several other real situations that fall outside of “yes” or “no.” For instance, what if the debtor never got a notice in the first place that the amount was due by the date the agency is monitoring? Or what if the debtor sent the money and it got lost in transit? Or what if the debtor made the payment to the original person they owed, rather than to the collection agency? Or what if the debtor sent in the money on time, and the collection agency incorrectly didn’t credit the debtor for the payment? Etc., etc.
For the computer program (model) to be accurate, all of these scenarios need to be able to be handled properly (legally, timely, etc.). Can you begin to see the complexity here, just with this very simple example of a payment not being received on time?
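Here is a sketch of what those extra cases look like in code. The outcome names are invented, but the point stands: once the situation stops being binary, every additional branch needs its own legally correct and timely handling:

```python
# The "yes/no" model sees only the first two outcomes; the rest are the
# real-world cases described above, each needing its own handling rules.
from enum import Enum, auto

class PaymentOutcome(Enum):
    PAID = auto()
    NOT_PAID = auto()               # the only cases the binary model knows
    NOTICE_NEVER_RECEIVED = auto()  # debtor was never told the due date
    LOST_IN_TRANSIT = auto()        # payment sent but never arrived
    PAID_ORIGINAL_CREDITOR = auto() # paid the original party, not the agency
    MISAPPLIED_BY_AGENCY = auto()   # paid on time, credited incorrectly

def respond(outcome):
    if outcome is PaymentOutcome.PAID:
        return "close the case"
    if outcome is PaymentOutcome.NOT_PAID:
        return "send follow-up notice"
    # Collapsing any of these into NOT_PAID means dunning someone who paid.
    return "route to a human reviewer"

print(respond(PaymentOutcome.LOST_IN_TRANSIT))  # route to a human reviewer
```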
There is still another significant factor (we’re up to #4 now) not mentioned yet. What about the situation where the debtor hasn’t paid, but it’s because his child has MS, and he has no insurance? How does a computer programmer write code for more abstract concepts, like “fairness”? In other words, can ones and zeros be arranged in such a way to represent intangibles? I think not.
So the bottom line question is this: Is there any way that a computer program can correctly handle all of these real-world possibilities - even in this simple debt collection case? The answer is no. We have considerable difficulties just translating the relatively simple thing we call language - e.g., Greek biblical texts into English. How many versions of the Bible are there? Why isn’t there just one?
Can we possibly hope to translate a process much more complicated than just words? We can certainly try, but clearly the answer is that there is a lot lost in the translation of any complex scenario (debtors, energy performance, etc.) into mathematical equations and computer code.
Some uninformed parties believe that the user has control of all the variables, and can manually (and accurately) change scenarios. That is incorrect, as the user-controlled elements only represent a small fraction of the actual number of factors that are built into the computer model. A similar fallacy is to think something like “we know the assumptions that the programmers made, and are adjusting accordingly.” This is wrong.
In writing a computer program of any complexity, there are literally hundreds of assumptions made. The computer programmer does not reveal all these to his customer, for much the same reasons that an accountant does not tell his client all of the assumptions made in preparing a tax return. He goes over a few of the more basic items, and then says “sign here.”
Oh, yes, this example brings up still another major variable (#7): the data the programmer uses as the basis for his creation. Just like preparing a tax return depends on two parties working together, writing a computer model is a collaboration between scientist and programmer. If the taxpayer gives incomplete or inaccurate data to the accountant, the result will be wrong. What’s disconcerting is that in many cases, neither party will know that the results are in error.
Similarly, if the scientist (inadvertently) gives incomplete or inaccurate data to the programmer to use in his creation, the result will likewise be wrong. And neither party will know it. There is still one more significant variable (#8) that we have to take into account. After a computer model is generated, there is an interpreter (e.g., the IPCC) that translates the “results” for politicians and the public (i.e., the media).
Here’s a surprise: These public interpretations are influenced by such factors as political, religious, environmental, financial, and scientific opinions. In their public revelations, do the interpreters explain all of their underlying biases? By now you know the answer: absolutely not. When these are introduced into the equation we obviously have strayed so far from scientific fact that it is not even in sight anymore.
So we need to think very carefully before we take major actions (e.g., spend a few trillion dollars based on climate predictions, wind energy projected performance, etc.) that are almost entirely based on computer models. What to do? Should we just scrap all computer models? No, that’s the other extreme. Computer models have merit - but shouldn’t be the tail wagging the dog.
We should realistically see computer models for what they are - tools to assist us in organizing our thoughts, and producers of highly subjective results that are simply starting points for real scientific analysis. Because of their inherent limitations (which I’ve just touched on here) all computer models should be treated with a very healthy degree of skepticism.
To ensure appropriate integrity, all computer models regarding matters of importance should be subjected to the rigors of scientific methodology. If they can’t accurately and consistently replicate real-world data, then they should be discarded. Unfortunately, that is not what is happening. We have gotten so addicted to the illusion that these programs are accurate - and some have become so agenda-driven - that we are now adjusting or discarding real-world data that doesn’t agree with the model. This is insane. If a model has not been proven to fully reflect reality, then it has very limited use and should be treated with the same degree of consideration that one might give a horoscope. See post here.
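As a minimal sketch of the kind of validation gate being called for here - the data, the tolerance, and the mean-absolute-error criterion are all illustrative assumptions, not a standard from any modeling body:

```python
# Compare a model's predictions to held-back observations and reject
# the model if its average error exceeds a pre-declared tolerance.

def passes_validation(predicted, observed, tolerance):
    """True if the mean absolute error stays within tolerance."""
    errors = [abs(p - o) for p, o in zip(predicted, observed)]
    return sum(errors) / len(errors) <= tolerance

hindcast     = [0.10, 0.15, 0.30, 0.45]  # model output (hypothetical)
observations = [0.08, 0.20, 0.25, 0.60]  # measured values (hypothetical)

print(passes_validation(hindcast, observations, tolerance=0.2))  # True
```

The key discipline is that the tolerance is declared before the comparison is run - the opposite of adjusting the observations until they agree with the model.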